Aggregation Methods for Markov Reward Chains with Fast and Silent Transitions

Authors

  • Jasen Markovski
  • Nikola Trcka
Abstract

We analyze the derivation of Markov reward chains from the intermediate performance models that arise in formalisms for compositional performance analysis, such as stochastic process algebras and (generalized) stochastic Petri nets. The intermediate models are typically extensions of continuous-time Markov reward chains with instantaneous labeled transitions. We give stochastic meaning to the intermediate models using stochastically discontinuous Markov reward chains, for which there are two prominent aggregation methods: lumping and reduction to a pure Markov reward chain. As stochastically discontinuous Markov reward chains are not intuitive in nature, we consider Markov reward chains extended with transitions parameterized by a real variable. These transitions are called fast transitions when they are governed by explicit probabilities, and silent transitions when the probabilities are left unspecified. In the asymptotic case, when the parameter tends to infinity, the models behave as stochastically discontinuous Markov reward chains. For all Markovian models we develop two aggregation methods, one based on reduction and the other based on lumping, and we give a comparative analysis of the two.
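
To make the limiting construction concrete, here is a minimal sketch of the standard two-scale setup, under the usual regularity assumptions; the symbols Q_s (slow part), Q_f (fast part), Π (ergodic projection), and τ (the real parameter) are our own notation and need not match the paper's:

\[
  Q(\tau) \;=\; Q_s + \tau\, Q_f , \qquad \tau > 0 ,
  \qquad\qquad
  \Pi \;=\; \lim_{t \to \infty} e^{Q_f t} .
\]

For every finite τ, Q(τ) generates an ordinary Markov reward chain in which the transitions collected in Q_f fire at rates proportional to τ. As τ tends to infinity these transitions become instantaneous, and the transition matrices P_τ(t) = exp(Q(τ)t) converge for t > 0 to those of a process whose limit as t → 0+ is not the identity matrix but the projection Π; this failure of continuity at zero is exactly the stochastic discontinuity referred to above. When the branching probabilities of the instantaneous steps are left unspecified, the same transitions are the silent transitions of the abstract.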

Related articles

Compositionality for Markov reward chains with fast and silent transitions

A parallel composition is defined for Markov reward chains with stochastic discontinuity, and with fast and silent transitions. In this setting, compositionality with respect to the relevant aggregation preorders is established. For Markov reward chains with fast transitions the preorders are τ-lumping and τ-reduction. Discontinuous Markov reward chains are ‘limits’ of Markov reward chains wi...

Compositionality for Markov Reward Chains with Fast Transitions

A parallel composition is defined for Markov reward chains with fast transitions and for discontinuous Markov reward chains. In this setting, compositionality with respect to the relevant aggregation preorders is established. For Markov reward chains with fast transitions the preorders are τ-lumping and τ-reduction. Discontinuous Markov reward chains are ‘limits’ of Markov reward chains with ...

Analysing reward measures of LARES performability models by discontinuous Markov chains

This paper presents a new method for specifying and analysing Markovian performability models. An extension of the LARES modelling language is considered which offers both delayed and immediate transitions, as well as rate and impulse rewards on whose basis different types of reward measures can be defined. The paper describes the evaluation path, starting from the modular and hierarchical LARE...

Comparing Markov chains: Combining aggregation and precedence relations applied to sets of states

Numerical methods for solving Markov chains are in general inefficient if the state space of the chain is very large (or infinite) and lacking a simple repeating structure. One alternative to solving such chains is to construct models that are simple to analyze and that provide bounds for a reward function of interest. We present a new bounding method for Markov chains inspired by Markov reward...

Comparing Markov Chains: Aggregation and Precedence Relations Applied to Sets of States, with Applications to Assemble-to-Order Systems

Solving Markov chains is in general difficult if the state space of the chain is very large (or infinite) and lacking a simple repeating structure. One alternative to solving such chains is to construct models that are simple to analyze and provide bounds for a reward function of interest. We present a new bounding method for Markov chains inspired by Markov reward theory: Our method constructs...

Journal title:

Volume   Issue

Pages   -

Publication date: 2008